Search results for "Visual integration"
Multisensory integration of drumming actions: musical expertise affects perceived audiovisual asynchrony
2009
We investigated the effect of musical expertise on sensitivity to asynchrony for drumming point-light displays, which varied in their physical characteristics (Experiment 1) or in their degree of audiovisual congruency (Experiment 2). In Experiment 1, 21 repetitions of three tempos x three accents x nine audiovisual delays were presented to four jazz drummers and four novices. In Experiment 2, ten repetitions of two audiovisual incongruency conditions x nine audiovisual delays were presented to 13 drummers and 13 novices. Participants gave forced-choice judgments of audiovisual synchrony. The results of Experiment 1 show an enhancement in experts' ability to detect asynchrony, especially fo…
Issues Related to the Restoration of Mirrors of the Wooden paliotto della chiesa del Santissimo Crocifisso all’Albergheria, Sicily (Italy)
2017
In this work, the decision-making process involved in the restoration of the eighteenth century paliotto ligneo (wooden altar frontal) della chiesa del Santissimo Crocifisso all’Albergheria of Palermo is presented. Earlier research concerning mirror restoration was based on only a few case studies and the proposed techniques were not suitable for the artwork here. As a consequence, it was necessary to re-examine theories and protocols of modern restoration to plan an appropriate intervention of the altar frontal. Since in this artwork the role of mirrors is not to give back images, as usual, but rather to create special light effects and play of lights, this work aims to find an app…
Audiovisual processing of Chinese characters elicits suppression and congruency effects in MEG
2019
Learning to associate written letters/characters with speech sounds is crucial for reading acquisition. Most previous studies have focused on audiovisual integration in alphabetic languages. Less is known about logographic languages such as Chinese characters, which map onto mostly syllable-based morphemes in the spoken language. Here we investigated how long-term exposure to native language affects the underlying neural mechanisms of audiovisual integration in a logographic language using magnetoencephalography (MEG). MEG sensor and source data from 12 adult native Chinese speakers and a control group of 13 adult Finnish speakers were analyzed for audiovisual suppression (bimodal responses…
Top-Down Predictions of Familiarity and Congruency in Audio-Visual Speech Perception at Neural Level.
2019
During speech perception, listeners rely on multimodal input and make use of both auditory and visual information. When presented with speech, for example syllables, the differences in brain responses to distinct stimuli are not, however, caused merely by the acoustic or visual features of the stimuli. The congruency of the auditory and visual information and the familiarity of a syllable, that is, whether it appears in the listener’s native language or not, also modulates brain responses. We investigated how the congruency and familiarity of the presented stimuli affect brain responses to audio-visual (AV) speech in 12 adult Finnish native speakers and 12 adult Chinese native speakers. The…